Web Survey Bibliography
National statistical institutes have to satisfy an ever-growing demand for statistical information. At the same time, they face new challenges such as increasing nonresponse rates, decreasing budgets, and demands for reducing the response burden. This may lead to new ways of conducting surveys. Some of these new developments are discussed in this paper.
Increasing nonresponse affects the representativity of survey data, and therefore the quality of survey outcomes. A new indicator (the R-indicator) is described that measures the representativity of survey response. Such an indicator can be a useful additional indicator for survey quality. It may be applied during the fieldwork of the survey to focus data collection efforts. It may also be useful to compare a survey over time, or to compare surveys in different countries.
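The R-indicator described above (Schouten, Cobben & Bethlehem, 2009) is commonly defined as R(ρ) = 1 − 2·S(ρ), where S(ρ) is the standard deviation of the response propensities: when all units are equally likely to respond, S(ρ) = 0 and R = 1 (fully representative response). A minimal sketch, assuming the propensities are already available (in practice they would be estimated, e.g. by logistic regression on auxiliary variables):

```python
import statistics

def r_indicator(propensities):
    """R-indicator of survey response: R(rho) = 1 - 2 * S(rho).

    S(rho) is the standard deviation of the response propensities
    (the population version is used here for simplicity). R = 1 means
    every unit is equally likely to respond; lower values indicate a
    less representative response.
    """
    s = statistics.pstdev(propensities)
    return 1 - 2 * s

# Illustrative (assumed) propensities, not estimated from real data:
uniform = [0.6] * 100                 # everyone equally likely to respond
skewed = [0.9] * 50 + [0.3] * 50      # one subgroup responds far less often

print(r_indicator(uniform))  # 1.0 (fully representative)
print(r_indicator(skewed))   # 0.4 (markedly less representative)
```

Monitoring this value during fieldwork, as the abstract suggests, would mean re-estimating the propensities as responses come in and directing data collection effort at the underrepresented groups.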
National statistical offices have to produce reliable and accurate statistics. This is often done with face-to-face or telephone surveys to collect the data that form the basis for these statistics. This is an expensive way of survey data collection, but experience has shown that it is necessary in order to obtain high-quality data. Now that many of these offices are faced with reduced budgets, web surveys may offer a less costly alternative. This type of survey is becoming increasingly popular, but it also has methodological drawbacks. The question is addressed whether web surveys can be used effectively in official statistics, either as a single-mode survey or as one of the modes in a mixed-mode survey.
A mixed-mode survey may also be a means to reduce nonresponse rates. Response behaviour may depend on the data collection mode. By approaching people with the mode best suited to them, they may be more inclined to respond. Responsive survey designs aim to adapt the data collection procedures during the fieldwork, for example by focusing fieldwork on a specific group with a specific mode. Such an approach requires information on the progress of the fieldwork, which stresses the increasingly important role of paradata.
Conference homepage (abstract)
Web survey bibliography - Bethlehem, J. (21)
- The perils of non-probability sampling; 2017; Bethlehem, J.
- Sunday shopping – The case of three surveys; 2016; Bethlehem, J.
- Solving the Nonresponse Problem With Sample Matching?; 2016
- Using Web Panels for Official Statistics; 2014; Bethlehem, J.
- Web Surveys in Official Statistics; 2014; Bethlehem, J.
- Online panel research: History, concepts, applications and a look at the future; 2014; Callegaro, M., Baker, R., Bethlehem, J., Goeritz, A., Krosnick, J. A., Lavrakas, P. J.
- Using response probabilities for assessing representativity; 2012; Bethlehem, J.
- Web Surveys: Methodological Problems and Research Perspectives; 2012; Biffignandi, S., Bethlehem, J.
- Can web surveys provide an adequate alternative to phone and face to face surveys?; 2011; Bethlehem, J.
- Selection Bias in Web Surveys; 2010; Bethlehem, J.
- Can we make official statistics with self-selection web surveys?; 2009; Bethlehem, J.
- The rise of survey sampling; 2009; Bethlehem, J.
- New developments in survey methodology for official statistics; 2009; Bethlehem, J.
- Indicators for the representativeness of survey response; 2009; Schouten, B., Cobben, F., Bethlehem, J.
- Use of Web surveys in Official Statistics; 2009; Bethlehem, J.
- Applied Survey Methods: A Statistical Perspective (Wiley Series in Survey Methodology); 2009; Bethlehem, J.
- Nonresponse Bias in Surveys; 2009; Bethlehem, J., Vehovar, V., Stoop, I., Schouten, B., Shlomo, N., Skinner, C., Montaquila, J.
- Representativity of web surveys – an illusion?; 2008; Bethlehem, J.
- How accurate are self-selection web surveys?; 2008; Bethlehem, J.
- Blaise – Alive and kicking for 20 years; 2006; Bethlehem, J., Hofman, L.
- Methodological guidelines for Blaise web surveys; 2003; Bethlehem, J., Hoogendoorn, A.